Softmax function
In mathematics, in particular probability theory and related fields, the softmax function, or normalized exponential, is a generalization of the logistic function that "squashes" a ''K''-dimensional vector \mathbf{z} of arbitrary real values to a ''K''-dimensional vector \sigma(\mathbf{z}) of real values in the range (0, 1) that add up to 1. The function is given by
:\sigma(\mathbf{z})_j = \frac{e^{z_j}}{\sum_{k=1}^K e^{z_k}}    for ''j'' = 1, ..., ''K''.
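As a concrete illustration, the definition above can be sketched in a few lines of NumPy (a minimal sketch; the subtraction of the maximum is a standard numerical-stability trick, not part of the mathematical definition):

```python
import numpy as np

def softmax(z):
    """Softmax of a K-dimensional vector z: e^{z_j} / sum_k e^{z_k}."""
    # Subtracting max(z) cancels in the ratio but prevents overflow in exp.
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([1.0, 2.0, 3.0])
s = softmax(z)
# Each component of s lies in (0, 1), the components sum to 1,
# and the ordering of the inputs is preserved.
```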
The softmax function is the gradient-log-normalizer of the categorical probability distribution. For this reason, the softmax function is used in various probabilistic multiclass classification methods including multinomial logistic regression, multiclass linear discriminant analysis, naive Bayes classifiers and artificial neural networks.〔ai-faq (What is a softmax activation function? )〕 Specifically, in multinomial logistic regression and linear discriminant analysis, the input to the function is the result of ''K'' distinct linear functions, and the predicted probability for the ''j''th class given a sample vector \mathbf{x} is:
:P(y=j\mid\mathbf{x}) = \frac{e^{\mathbf{x}^\mathsf{T}\mathbf{w}_j}}{\sum_{k=1}^K e^{\mathbf{x}^\mathsf{T}\mathbf{w}_k}}
This can be seen as the composition of ''K'' linear functions \mathbf{x} \mapsto \mathbf{x}^\mathsf{T}\mathbf{w}_1, \ldots, \mathbf{x} \mapsto \mathbf{x}^\mathsf{T}\mathbf{w}_K and the softmax function (where \mathbf{x}^\mathsf{T}\mathbf{w} denotes the inner product of \mathbf{x} and \mathbf{w}).
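This composition can be sketched as follows (the weight vectors and the sample vector here are made-up illustrative values, not from the source; each row of W plays the role of one \mathbf{w}_k):

```python
import numpy as np

def class_probabilities(x, W):
    """P(y=j|x) = exp(x.w_j) / sum_k exp(x.w_k), with the w_k as rows of W."""
    scores = W @ x                        # the K distinct linear functions of x
    e = np.exp(scores - np.max(scores))   # numerically stable softmax of the scores
    return e / e.sum()

x = np.array([1.0, -0.5])
W = np.array([[ 0.2,  0.8],
              [ 1.0, -1.0],
              [-0.3,  0.5]])
p = class_probabilities(x, W)
# p is a probability vector over the 3 classes; the class with the
# largest linear score also gets the largest probability.
```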
== Artificial neural networks ==
In neural network simulations, the softmax function is often implemented at the final layer of a network used for classification. Such networks are then trained under a log loss (or cross-entropy) regime, giving a non-linear variant of multinomial logistic regression.
Since the function maps a vector and a specific index ''i'' to a real value, the derivative needs to take the index into account:
: \frac{\partial}{\partial q_k}\sigma(\textbf{q}, i) = \cdots = \sigma(\textbf{q}, i)(\delta_{ik} - \sigma(\textbf{q}, k))
Here, the Kronecker delta \delta_{ik} is used for simplicity (cf. the derivative of the sigmoid function, which is likewise expressed in terms of the function itself).
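The Jacobian formula above can be checked numerically against central finite differences (a sketch with an arbitrary test vector; finite differences are only an approximation, so the comparison uses a tolerance):

```python
import numpy as np

def softmax(q):
    e = np.exp(q - np.max(q))
    return e / e.sum()

q = np.array([0.5, -1.0, 2.0])
K = len(q)
s = softmax(q)

# Analytic Jacobian from the formula: d sigma_i / d q_k = sigma_i (delta_ik - sigma_k)
jac = np.diag(s) - np.outer(s, s)

# Central-difference approximation of the same Jacobian
eps = 1e-6
num = np.zeros((K, K))
for k in range(K):
    dq = np.zeros(K)
    dq[k] = eps
    num[:, k] = (softmax(q + dq) - softmax(q - dq)) / (2 * eps)
# jac and num agree to within finite-difference error.
```

Note that each column of the Jacobian sums to zero, reflecting the constraint that the outputs always sum to 1.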
See Multinomial logit for a probability model which uses the softmax activation function.

Excerpt source: the free encyclopedia Wikipedia. Read the full "Softmax function" article on Wikipedia.
Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.